Entropy-Rate Driven Inference of Stochastic Grammars
Author
Abstract
A new method for inferring a particular class of stochastic grammars is presented. The method, called the Hybrid Model Learner (HML), uses the entropy rate to guide an agglomeration process based on rules of the type ab->c. Each rule derived from the input sequence is associated with a specific entropy-rate difference. A grammar inferred automatically from an example sequence can then be used to detect and recognize similar structures in unknown sequences. Two influential schools of thought, structuralism and 'stochasticism', are discussed, including how the two have converged and are shaping current statistical learning methods. It is argued that syntactic methods may provide universal tools for modeling and describing structures from the most elementary level, that of signals, up to the highest, that of language.
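The abstract does not spell out HML's exact selection criterion, but the core idea it describes, associating each candidate rule ab->c with an entropy-rate difference and letting those differences guide agglomeration, can be sketched as follows. This is a minimal illustration under stated assumptions, not the published algorithm: `entropy_rate` uses a simple bigram (first-order Markov) estimate, and the new-symbol name passed to `rule_deltas` is arbitrary.

```python
import math
from collections import Counter

def entropy_rate(seq):
    """Bigram (first-order Markov) estimate of the entropy rate, in bits/symbol."""
    pairs = Counter(zip(seq, seq[1:]))
    firsts = Counter(seq[:-1])
    n = len(seq) - 1
    return -sum((c / n) * math.log2(c / firsts[a]) for (a, b), c in pairs.items())

def replace_pair(seq, pair, c):
    """Apply the rule ab -> c: rewrite every non-overlapping occurrence of `pair`."""
    out, i = [], 0
    while i < len(seq):
        if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
            out.append(c)
            i += 2
        else:
            out.append(seq[i])
            i += 1
    return out

def rule_deltas(seq, new_symbol="C"):
    """Entropy-rate difference associated with each candidate rule ab -> c."""
    base = entropy_rate(seq)
    return {pair: entropy_rate(replace_pair(seq, pair, new_symbol)) - base
            for pair in set(zip(seq, seq[1:]))}
```

For a sequence such as `list("abxaby" * 10)`, `rule_deltas` maps every observed bigram to the change in estimated entropy rate its agglomeration would cause; a learner in the spirit of HML would iterate, picking one rule per round according to these differences.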
Related Articles
Accurate Computation of the Relative Entropy Between Stochastic Regular Grammars
Works dealing with grammatical inference of stochastic grammars often evaluate the relative entropy between the model and the true grammar by means of large test sets generated with the true distribution. In this paper, an iterative procedure to compute the relative entropy between two stochastic deterministic regular grammars is proposed.
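The evaluation practice this paper improves on, estimating relative entropy from large test sets sampled from the true distribution, can be illustrated with Markov chains standing in for stochastic regular grammars. A hypothetical sketch; the function names and parameters are illustrative, not from the paper:

```python
import math
import random

def sample_chain(P, init, n, rng):
    """Sample a length-n state sequence from transition probabilities P[a][b]."""
    s, seq = init, [init]
    for _ in range(n - 1):
        states, probs = zip(*P[s].items())
        s = rng.choices(states, probs)[0]
        seq.append(s)
    return seq

def log_prob(P, seq):
    """Log-probability (base 2) of the transitions in seq under model P."""
    return sum(math.log2(P[a][b]) for a, b in zip(seq, seq[1:]))

def mc_relative_entropy(P_true, P_model, init, n_seqs=2000, length=50, seed=0):
    """Monte Carlo estimate of D(true || model) per symbol, using test
    sequences generated with the true distribution."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_seqs):
        seq = sample_chain(P_true, init, length, rng)
        total += log_prob(P_true, seq) - log_prob(P_model, seq)
    return total / (n_seqs * (length - 1))
```

The estimate converges to the true relative entropy rate only as the test set grows, which is exactly the cost the paper's iterative exact procedure avoids.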
Inference of Markov Chain: A Review on Model Comparison, Bayesian Estimation and Rate of Entropy
This article has no abstract.
Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain
In this paper we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain defined by observing the output of a discrete stochastic channel whose input is the finite-state-space homogeneous stationary Markov chain. For this purpose, we obtain the relative entropy between two finite subsequences of the above-mentioned chains with the help of the definition of...
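For the simpler case of two ordinary stationary Markov chains (not the hidden-Markov case this paper treats), the relative entropy rate has a well-known closed form, D(P||Q) = sum_i pi_i sum_j P_ij log(P_ij / Q_ij), where pi is the stationary distribution of P. A minimal sketch, with the stationary distribution obtained by power iteration:

```python
import math

def stationary(P, iters=1000):
    """Stationary distribution of transition matrix P (list of rows) by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def kl_rate(P, Q):
    """Relative entropy rate D(P || Q) between two stationary Markov chains, in bits."""
    pi = stationary(P)
    n = len(P)
    return sum(pi[i] * P[i][j] * math.log2(P[i][j] / Q[i][j])
               for i in range(n) for j in range(n)
               if P[i][j] > 0)
```

The hidden-Markov case studied in the paper is harder precisely because no such single-sum closed form exists once the chain is observed through a noisy channel.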
The Rate of Entropy for Gaussian Processes
In this paper, we show that in order to obtain the Tsallis entropy rate for stochastic processes, we can use the limit of conditional entropy, as was done for the Shannon and Rényi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Rényi, Shannon and Tsallis entropy rates for stationary Gaussian proc...
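For the Shannon case the abstract refers to, a classical closed form exists: by the Kolmogorov–Szegő result, a stationary Gaussian process with power spectral density $S(\lambda)$ has differential entropy rate (in nats)

```latex
h = \frac{1}{2}\log(2\pi e) + \frac{1}{4\pi}\int_{-\pi}^{\pi} \log S(\lambda)\,\mathrm{d}\lambda ,
```

which is the quantity the limit-of-conditional-entropy argument recovers in the Shannon case before being generalized to the Rényi and Tsallis rates.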